
    Modelling and solving temporal reasoning as propositional satisfiability

    Representing and reasoning about time-dependent information is a key research issue in many areas of computer science and artificial intelligence. One of the best-known and most widely used formalisms for representing interval-based qualitative temporal information is Allen's interval algebra (IA). The fundamental reasoning task in IA is to find a scenario that is consistent with the given information; this problem is in general NP-complete. In this paper, we investigate how an interval-based representation, or IA network, can be encoded into a propositional formula over Boolean variables and/or predicates in decidable theories. Our task is to discover whether satisfying such a formula can be more efficient than finding a consistent scenario for the original problem. There are two basic approaches to modelling an IA network: one represents the relations between intervals as variables, and the other represents the end-points of each interval as variables. By combining these two approaches with three different Boolean satisfiability (SAT) encoding schemes, we produced six encoding schemes for converting IA to SAT. In addition, we showed how IA networks can be formulated as satisfiability modulo theories (SMT) formulae based on quantifier-free integer difference logic (QF-IDL). These encodings were empirically studied using randomly generated IA problems of sizes ranging from 20 to 100 nodes. A general conclusion we draw from these experimental results is that encoding IA into SAT produces better results than existing approaches. More specifically, we show that the new point-based 1-D support SAT encoding of IA produces consistently better results than the other alternatives considered. In comparison with the six different SAT encodings, the SMT encoding came fourth, after the point-based and interval-based 1-D support schemes and the point-based direct scheme. Further, we observe that the phase transition region maps directly from the IA encoding to each SAT or SMT encoding, but, surprisingly, the location of the hard region varies according to the encoding scheme. Our results also show a fixed performance ranking order over the various encoding schemes.
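    The interval-based modelling approach described above can be illustrated with a minimal sketch. The snippet below is a simplified, hypothetical rendering (not the paper's exact scheme): one Boolean variable per interval pair and Allen base relation, with clauses enforcing that exactly one relation holds per pair and that relations excluded by the network are forbidden. The composition-table clauses required for a complete encoding are deliberately omitted.

```python
from itertools import combinations

# The 13 Allen base relations (names are illustrative shorthand).
BASE_RELATIONS = ["b", "bi", "m", "mi", "o", "oi", "s", "si", "d", "di", "f", "fi", "eq"]

def interval_direct_encoding(n_intervals, constraints):
    """Sketch of an interval-based direct SAT encoding of an IA network.

    constraints maps an interval pair (i, j) to the set of base relations
    allowed between them. Returns (num_vars, clauses) with clauses as
    DIMACS-style lists of signed integers. Composition-table clauses,
    needed for completeness, are omitted for brevity.
    """
    var = {}
    def v(i, j, r):
        key = (i, j, r)
        if key not in var:
            var[key] = len(var) + 1
        return var[key]

    clauses = []
    for i, j in combinations(range(n_intervals), 2):
        lits = [v(i, j, r) for r in BASE_RELATIONS]
        clauses.append(lits)                   # at least one relation holds
        for a, b in combinations(lits, 2):
            clauses.append([-a, -b])           # at most one relation holds
        allowed = constraints.get((i, j), set(BASE_RELATIONS))
        for r in BASE_RELATIONS:
            if r not in allowed:
                clauses.append([-v(i, j, r)])  # forbid excluded relations
    return len(var), clauses

# Three intervals; the pair (0, 1) is restricted to "before" or "meets".
num_vars, clauses = interval_direct_encoding(3, {(0, 1): {"b", "m"}})
```

The resulting clause set can be handed to any standard SAT solver; a satisfying assignment selects one base relation per interval pair, i.e. a consistent scenario candidate under the constraints modelled here.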

    Additive versus multiplicative clause weighting for SAT

    This paper examines the relative performance of additive and multiplicative clause weighting schemes for propositional satisfiability testing. Starting with one of the most recently developed multiplicative algorithms (SAPS), an experimental study was constructed to isolate the effects of multiplicative weighting in comparison to additive weighting, while controlling other key features of the two approaches, namely the use of random versus flat moves, deterministic versus probabilistic weight smoothing, and multiple versus single inclusion of literals in the local search neighbourhood. As a result of this investigation we developed a pure additive weighting scheme (PAWS) which can outperform multiplicative weighting on a range of difficult problems, while requiring considerably less effort in terms of parameter tuning.
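    The additive scheme the abstract describes can be sketched in a few lines. The toy solver below is an illustrative stand-in, not PAWS itself: on a local minimum it adds 1 to the weight of every falsified clause (the additive step), and with a small probability it smooths by subtracting 1 from inflated weights; the `smooth_prob` parameter and the probabilistic smoothing rule are assumptions for this sketch.

```python
import random

def additive_weighting_solve(clauses, n_vars, max_flips=10000, smooth_prob=0.05, seed=0):
    """Toy additive clause-weighting local search (PAWS-style sketch).

    clauses: list of clauses, each a list of signed ints (DIMACS style).
    Returns a satisfying assignment dict {var: bool}, or None on failure.
    """
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}
    weights = [1] * len(clauses)

    def lit_true(lit):
        return assign[abs(lit)] == (lit > 0)

    def falsified():
        return [i for i, c in enumerate(clauses) if not any(lit_true(l) for l in c)]

    def cost():
        return sum(weights[i] for i in falsified())

    for _ in range(max_flips):
        false_idx = falsified()
        if not false_idx:
            return assign                        # all clauses satisfied
        base = sum(weights[i] for i in false_idx)
        best_var, best_delta = None, 0
        for var in {abs(l) for i in false_idx for l in clauses[i]}:
            assign[var] = not assign[var]        # try the flip
            delta = cost() - base
            assign[var] = not assign[var]        # undo it
            if delta < best_delta:
                best_var, best_delta = var, delta
        if best_var is not None:
            assign[best_var] = not assign[best_var]
        elif rng.random() < smooth_prob:
            weights = [max(1, w - 1) for w in weights]  # smoothing step
        else:
            for i in false_idx:
                weights[i] += 1                  # additive weight increase
    return None

# Tiny satisfiable formula: (x1 v x2) & (~x1 v x2) & (x1 v ~x2)
model = additive_weighting_solve([[1, 2], [-1, 2], [1, -2]], 2)
```

The contrast with a multiplicative scheme such as SAPS is the update rule: here falsified-clause weights grow by a constant increment rather than by a multiplicative factor, which is what leaves fewer parameters to tune.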

    Algorithmic Foundations of Inexact Computing

    Inexact computing, also referred to as approximate computing, is a style of designing algorithms and computing systems wherein the accuracy or correctness of the algorithms executing on them is deliberately traded for significant resource savings. Significant progress has been reported in this regard, both in terms of hardware and of software or custom algorithms that exploit this approach, resulting in some loss in solution quality (accuracy) while garnering disproportionately high savings. However, these approaches tended to be ad hoc and were tied to specific algorithms and technologies. Consequently, a principled approach to designing and analyzing such algorithms was lacking. In this paper, we provide a novel model which allows us to characterize the behavior of algorithms designed to be inexact, as well as the opportunities and benefits that this approach offers. Our methods are therefore amenable to standard asymptotic analysis and provide a clean, unified abstraction through which an algorithm's design and analysis can be conducted. With this as a backdrop, we show that inexactness can be significantly beneficial for some fundamental problems, in that the quality of a solution can be exponentially better when one exploits inexactness than under approaches that are agnostic to, and unable to exploit, it. We show that such gains are possible in the context of evaluating Boolean functions, rooted in the theory of Boolean functions and their spectra, PAC learning, and sorting. Formally, this is accomplished by introducing the twin concepts of inexactness-aware and inexactness-oblivious approaches to designing algorithms; the exponential gains are shown by taking the ratio of the quality of the solution under the "aware" approach to that under the "oblivious" approach.

    Complexation of 6-(4′-toluidinyl)naphthalene-2-sulfonate by β-cyclodextrin and linked β-cyclodextrin dimers

    The complexation of 6-(4′-toluidinyl)naphthalene-2-sulfonate, TNS−, by β-cyclodextrin (βCD) and five linked βCD dimers is characterized by UV-Vis, fluorescence, and ¹H NMR spectroscopy. In aqueous phosphate buffer at pH 7.0, I = 0.10 mol dm⁻³ and 2

    Civil Good - A Platform For Sustainable and Inclusive Online Discussion

    Civil Good is a website concept proposed by Alan Mandel with the goal of enabling safe, anonymous, productive, and civil discourse without the disruptive behavior and language common to much of the Internet. The goal of Civil Good is to improve the critical thinking and discussion skills of its users while combating the effects of political polarization and misinformation in society. This paper analyzes Mandel's proposed concept, providing additional research to either support or refute the various features proposed, and recommendations to simplify user interactions. It also examines topics mentioned only briefly or not discussed by Mandel, such as data protection methods, the psychology of Web browsing, marketing, operational costs, legal issues, monetization options, and mobile presence.

    A model immunization programme to control Japanese encephalitis in Viet Nam.

    In Viet Nam, an inactivated, mouse brain-derived vaccine for Japanese encephalitis (JE) has been given exclusively to children aged ≤5 years in 3 paediatric doses since 1997. However, JE incidence remained high, especially among children aged 5-9 years. We conducted a model JE immunization programme to assess the feasibility and impact of JE vaccine administered to children aged 1-9 years in a standard 3-dose regimen: paediatric doses for children aged <3 years and adult doses for those aged ≥3 years. Of the targeted children, 96.2% were immunized with ≥2 doses of the vaccine. Compared to the national immunization programme, the JE incidence rate declined sharply in districts with the model programme (from 11.32 to 0.87 per 100,000 between the pre- and post-vaccination periods). The rate of reduction was greatest in the 5-9 years age group. We recommend a policy change to include children aged 5-9 years in the catch-up immunization campaign and to administer a 4th dose to those aged 5-9 years who had received 3 doses of the vaccine during the first 2-3 years of life.